285 research outputs found

    Characterisation of the dependability of middleware-based systems

    Get PDF
    Critical systems are subject, like the rest of the computing industry, to increasingly severe cost constraints. This pressure pushes developers to favour the reuse of software rather than carrying out developments specific to each project. This trend towards the use of "off-the-shelf" software components, often developed by third parties, is reinforced by increasingly complex technological needs, in particular the integration of systems into communication networks. The use of off-the-shelf components allows industrial developers to concentrate on their domain of expertise, without wasting effort re-developing functions that have already been implemented in other sectors. This trend towards reuse, together with the growing interconnection of systems, has favoured the emergence of interface standards that allow systems to interoperate even when they are developed by different groups. One such class of interface standards for system integration is communications middleware, such as the CORBA platform. This middleware facilitates interaction between disparate applications running on heterogeneous hardware and software platforms. For integrators of distributed systems, these technologies are attractive for several reasons, both technological and economic: they provide a rapid means of integrating new technologies, of increasing flexibility, and of opening up to other systems. However, this attractiveness is tempered by concerns about the robustness of middleware components, which have not benefited from the rigorous development process used in the context of critical systems. Integrators of semi-critical distributed systems want assurances about the quality and robustness of the components they integrate into their systems. They want information on the failure modes of the middleware and on the error propagation channels it introduces. They want quantitative information allowing them to compare candidate implementations from the dependability point of view, in order to select the candidate best suited to their needs. The problem we have just stated can be summarised in two points: (i) obtaining a better understanding of the types of faults and errors that exist in systems based on communications middleware; and (ii) developing a method for experimentally characterising the robustness of candidate middleware. Very little existing work addresses these questions. The objective of this thesis is to propose a characterisation method that can be applied to target middleware, and thereby to answer the needs of integrators of semi-critical systems as well as of middleware developers. Our contribution is a methodology for analysing the dependability of middleware. Our method is based on a structural analysis of communications middleware, the elaboration of a fault model, a classification of failure modes, and the development of a set of fault injection techniques adapted to middleware.
We validated our approach by carrying out fault injection campaigns targeting several implementations of the CORBA standard. ABSTRACT: We propose a method for the dependability assessment and failure mode characterization of communications middleware. The method is based on the structural analysis of communications-oriented middleware, the identification of a fault model, a failure modes classification, and the development of a number of fault injection techniques that can be used to target middleware implementations. We have applied our method by carrying out fault injection campaigns targeting a number of CORBA implementations, and obtained quantitative measures of the robustness of the different candidates. Our work allows integrators of dependable distributed systems to obtain assurances on the robustness of the software components they place at the heart of their systems, and provides information to middleware vendors regarding robustness failings in their products.
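
    The abstract above does not reproduce the injection tooling itself. Purely as a hedged sketch of the kind of network-level fault injection technique it alludes to, the Python fragment below implements a minimal TCP proxy that occasionally flips a single bit in the messages it forwards to a CORBA server, so that the reaction of the ORB under test can be observed and classified by failure mode. The port numbers, target address and corruption probability are invented for illustration and are not taken from the thesis.

```python
# Hypothetical sketch of network-level fault injection against middleware:
# a TCP proxy placed between a CORBA client and server that occasionally flips
# one bit in forwarded messages. The ORB's reaction (error code, hang, crash)
# can then be logged and classified. All constants below are illustrative.
import random
import socket
import threading

LISTEN_PORT = 5000              # clients connect here instead of the real ORB
TARGET = ("localhost", 2809)    # assumed endpoint of the ORB under test
FLIP_PROBABILITY = 0.05         # fraction of forwarded chunks to corrupt

def corrupt(data: bytes) -> bytes:
    """Flip one random bit in the payload with a fixed probability."""
    if data and random.random() < FLIP_PROBABILITY:
        buf = bytearray(data)
        i = random.randrange(len(buf))
        buf[i] ^= 1 << random.randrange(8)
        return bytes(buf)
    return data

def pipe(src: socket.socket, dst: socket.socket, inject: bool) -> None:
    """Forward bytes from src to dst, optionally injecting faults."""
    try:
        while chunk := src.recv(4096):
            dst.sendall(corrupt(chunk) if inject else chunk)
    except OSError:
        pass
    finally:
        dst.close()

def main() -> None:
    server = socket.socket()
    server.bind(("", LISTEN_PORT))
    server.listen()
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection(TARGET)
        # In this sketch only client-to-server traffic is corrupted.
        threading.Thread(target=pipe, args=(client, upstream, True), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client, False), daemon=True).start()

if __name__ == "__main__":
    main()
```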

    What We Owe Workers as a Matter of Common Humanity: Sickness and Caregiving Leaves and Pay in the Age of Pandemics

    Get PDF
    Workers commodifying their time in labour markets are liable to become temporarily incapable of doing so because of sickness or caregiving responsibilities. While the risk is universal, it will be experienced very differently depending on social conditions and arrangements and on social locations, such as gender, among others. In a society in which the vast majority of people are dependent on labour market incomes to survive, the consequences of being off work are severe, unless some protection and benefits are provided. Over time, Canada has developed a number of leave and income-replacement schemes, but the COVID-19 pandemic revealed, in dramatic fashion, their limitations, leading to the adoption of temporary measures to address the crisis. This article, written from a feminist political economy perspective, provides an overview of the historical development of sickness and caregiving leave and pay arrangements set against the background of changing social and economic reproduction regimes. It then examines more closely the slow development of Canada’s welfare state model of sickness and caregiving leaves and benefits since the 1970s, focusing on the federal government’s enactment of special employment insurance benefits and statutory leave rights in British Columbia and Ontario. Next, it critically examines the limitations of that statutory regime, as it existed immediately prior to the outbreak of the COVID-19 pandemic in Canada, and then considers the expansion of sick and caregiving leave and pay provisions enacted in response to the pandemic. The article then elaborates four principles to guide the future development of sick and caregiving entitlements and suggests ways of bringing the existing regime more into line with those principles. Finally, it sets out a few directions towards imagining a different regime that truly provides workers with what we conceive they are owed as a matter of common humanity.

    Toxic release dispersion modelling with PHAST : parametric sensitivity analysis

    Get PDF
    Recent changes to French legislation concerning the prevention of technological and natural risk require industrial sites to calculate safety perimeters for different accident scenarios, based on a detailed probabilistic risk assessment. It is important that the safety perimeters resulting from risk assessment studies are based on the best scientific knowledge available, and that the level of uncertainty is minimised. A significant contribution to the calculation of the safety perimeters comes from the modelling of atmospheric dispersion, particularly of the accidental release of toxic products. One of the most widely used tools for dispersion modelling in several European countries is PHAST™ [1]. This software application is quite flexible, allowing the user to alter values for a wide range of model parameters. Users of the software have found that simulation results may depend quite strongly on the values chosen for some of these parameters. While this flexibility is useful, it can lead different users to calculate effect distances that vary considerably, even when studying the same scenario. In order to better understand the influence of these input parameters, we have carried out a parametric sensitivity study of the PHAST dispersion models. This allows us to obtain global sensitivity indices for the input parameters, which quantify the level of influence of each parameter on the output of the model, as well as their interactions. The FAST (Fourier Amplitude Sensitivity Test) sensitivity analysis method that we have applied (using the SimLab software tool [2]) provides both first-order indices (which characterize a parameter’s influence on the model output when it varies in isolation) and total indices (which characterize a parameter’s influence including its joint interactions with other input parameters). We present results of this analysis for a number of toxic gas dispersion scenarios. The analysis considers parameters related to the physical release scenario (release rate, release height, etc.), to weather conditions (wind speed, stability class, atmospheric temperature, etc.) and to the numerical resolution (step size, etc.). We compare the results of several sensitivity analysis methods, both local one-at-a-time methods and global methods. We discuss the importance of selecting an appropriate model output when studying the model sensitivity (output measures considered include the concentration of the released gas at a long distance, at a short distance, and the maximal distance at which a specified concentration is attained). Our experimental results assume that the input parameters of the dispersion model are independent. However, correlations exist between several of the input parameters that we have analysed, such as wind speed and atmospheric stability class. We discuss various approaches to calculating sensitivity indices that take this correlation into account. [1] DNV Software, London, UK. [2] Saltelli, A., Chan, K., Scott, E. M., Sensitivity Analysis, 2004, John Wiley & Sons Publishers.
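
    The abstract does not give the computational setup used with SimLab. As a hedged illustration of how first-order and total sensitivity indices of the kind described can be estimated with the extended FAST method, the Python sketch below uses the SALib library in place of SimLab and a simple analytic stand-in function in place of an actual PHAST dispersion run. The parameter names, bounds and the surrogate model are illustrative assumptions, not values from the study.

```python
# Hedged sketch: estimating first-order (S1) and total (ST) sensitivity indices
# with the extended FAST method, in the spirit of the study described above.
# SALib stands in for SimLab, and a toy analytic function stands in for a real
# PHAST run; parameter names and ranges are invented for illustration.
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

problem = {
    "num_vars": 3,
    "names": ["release_rate", "wind_speed", "release_height"],
    "bounds": [[0.5, 5.0],    # kg/s (assumed range)
               [1.0, 10.0],   # m/s
               [0.0, 20.0]],  # m
}

def surrogate_concentration(x: np.ndarray) -> float:
    """Toy ground-level concentration proxy: grows with release rate,
    falls with wind speed and release height (not a real dispersion model)."""
    release_rate, wind_speed, height = x
    return release_rate / (wind_speed * (1.0 + 0.1 * height))

# Generate the FAST sample, evaluate the model, and compute the indices.
X = fast_sampler.sample(problem, 1000)
Y = np.array([surrogate_concentration(row) for row in X])
Si = fast.analyze(problem, Y)

for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:15s}  first-order = {s1:.3f}   total = {st:.3f}")
```

    With a real PHAST study, the surrogate function would be replaced by a scripted call to the dispersion model, and the chosen output measure (e.g. concentration at a given distance, or the distance to a threshold concentration) would determine which indices are computed.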

    Federal Enforcement of Migrant Workers’ Labour Rights in Canada: A Research Report

    Get PDF
    Although Canada’s migrant labour program is seen by some as a model of best practices, rights shortfalls and exploitation of workers are well documented. Through migration policy and the provision of work permits, federal authorities determine who can hire migrant workers and the conditions under which they are employed. Despite its authority over work permits, the federal government has historically had little to do with the regulation of working conditions. In 2015, the federal government introduced a new regulatory enforcement system - unique internationally for its attempt to enforce migrants’ workplace rights through federal migration policy - under which employers must comply with contractual employment terms, uphold provincial workplace standards, and make efforts to maintain a workplace free of abuse. Drawing on enforcement data and frontline law and policy documents, we critically assess the new enforcement system, concluding that it holds both promise and peril for migrant workers.

    A Simultaneous Stacking and Deblending Algorithm for Astronomical Images

    Full text link
    Stacking analysis is a means of detecting faint sources using a priori position information to estimate an aggregate signal from individually undetected objects. Confusion severely limits the effectiveness of stacking in deep surveys with limited angular resolution, particularly at far-infrared to submillimeter wavelengths, and causes a bias in stacking results. Deblending corrects measured fluxes for confusion from adjacent sources; however, we find that standard deblending methods only reduce the bias by roughly a factor of two while tripling the variance. We present an improved algorithm for simultaneous stacking and deblending that greatly reduces bias in the flux estimate with nearly minimum variance. When confusion from neighboring sources is the dominant error, our method improves upon RMS error by at least a factor of three, and by as much as an order of magnitude, compared to other algorithms. This improvement will be useful for Herschel and other telescopes working in a source-confused, low signal-to-noise regime. Comment: accepted to The Astronomical Journal. 18 pages, 6 figures.
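
    The paper's own estimator is not reproduced in the abstract. As a rough, hedged sketch of the general idea of fitting all catalog positions simultaneously rather than stacking each source in isolation, the Python fragment below builds a linear model in which each column is a beam centred on one known source position, solves for all fluxes jointly by least squares, and compares the result with a naive stack. The Gaussian beam, map size and noise level are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the "fit everything at once" idea behind simultaneous
# stacking and deblending: model the map as a sum of beams at known catalog
# positions, solve for all fluxes jointly, then average the fitted fluxes.
# This is a generic illustration, not the paper's algorithm; the beam width,
# map size, source counts and noise level are made up.
import numpy as np

rng = np.random.default_rng(0)
npix, fwhm = 64, 3.0                 # npix x npix map, Gaussian beam FWHM in pixels
sigma = fwhm / 2.355

def beam(x0: float, y0: float) -> np.ndarray:
    """Unit-amplitude Gaussian beam centred at (x0, y0), flattened to a vector."""
    y, x = np.mgrid[0:npix, 0:npix]
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)).ravel()

# Simulate a confused map: many faint sources at known (catalog) positions.
nsrc = 200
positions = rng.uniform(0, npix, size=(nsrc, 2))
true_fluxes = rng.exponential(1.0, size=nsrc)
A = np.column_stack([beam(x, y) for x, y in positions])   # (npix*npix, nsrc)
data = A @ true_fluxes + rng.normal(0, 0.5, size=npix * npix)

# Naive stack: average map value at each catalog position (biased by neighbours).
naive = np.mean([data.reshape(npix, npix)[int(y), int(x)] for x, y in positions])

# Simultaneous fit: solve for all fluxes at once, then average.
fitted, *_ = np.linalg.lstsq(A, data, rcond=None)
print(f"true mean flux  : {true_fluxes.mean():.3f}")
print(f"naive stack     : {naive:.3f}")
print(f"joint-fit stack : {fitted.mean():.3f}")
```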

    Stellar Spectropolarimetry

    Get PDF

    Accident Investigation and Learning to Improve Safety Management in Complex System: Remaining Challenges: Proceedings of the 55th ESReDA Seminar

    Get PDF
    Accident investigation and learning from events are fundamental processes in safety management, involving technical, human, organisational and societal dimensions. The European Safety, Reliability and Data Association (ESReDA) has a long tradition of bringing together experts in the field to work together, and to share and explore experiences of using various paradigms, approaches, methods, databases and implementations of safety systems across different industries. The 55th ESReDA seminar on “Accident Investigation and Learning to Improve Safety Management in Complex System: Remaining Challenges” attracted more than 80 participants from industry, authorities, operators, research centres and academia. The seminar programme consisted of 22 technical papers, three keynote speeches and a workshop to debate the remaining challenges of accident investigation and potential innovative breakthroughs. JRC.G.10 - Knowledge for Nuclear Security and Safety

    The Atacama Cosmology Telescope: Dusty Star-Forming Galaxies and Active Galactic Nuclei in the Southern Survey

    Get PDF
    We present a catalog of 191 extragalactic sources detected by the Atacama Cosmology Telescope (ACT) at 148 GHz and/or 218 GHz in the 2008 Southern survey. Flux densities span 14-1700 mJy, and we use source spectral indices derived using ACT-only data to divide our sources into two sub-populations: 167 radio galaxies powered by central active galactic nuclei (AGN), and 24 dusty star-forming galaxies (DSFGs). We cross-identify 97% of our sources (166 of the AGN and 19 of the DSFGs) with those in currently available catalogs. When combined with flux densities from the Australia Telescope 20 GHz survey and follow-up observations with the Australia Telescope Compact Array, the synchrotron-dominated population is seen to exhibit a steepening of the slope of the spectral energy distribution from 20 to 148 GHz, with the trend continuing to 218 GHz. The ACT dust-dominated source population has a median spectral index of 3.7 (+0.62, -0.86), and includes both local galaxies and sources with redshifts as great as 5.6. Dusty sources with no counterpart in existing catalogs likely belong to a recently discovered subpopulation of DSFGs lensed by foreground galaxies or galaxy groups. Comment: 13 pages, 8 figures, 4 tables.
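
    As a small hedged illustration of how a two-band spectral index of the kind quoted above is computed (assuming the usual convention S_nu ∝ nu^alpha between 148 and 218 GHz), the Python snippet below derives alpha from a pair of flux densities. The example flux densities are invented and are not values from the catalog.

```python
# Hedged illustration: two-band spectral index alpha under the convention
# S_nu ∝ nu**alpha, as used to separate synchrotron-dominated (falling spectra,
# alpha < 0) from dust-dominated (rising spectra, alpha > 0) sources.
# The example flux densities are invented, not values from the catalog.
import math

def spectral_index(s_148: float, s_218: float) -> float:
    """Spectral index between 148 and 218 GHz from flux densities in mJy."""
    return math.log(s_218 / s_148) / math.log(218.0 / 148.0)

print(spectral_index(s_148=20.0, s_218=80.0))   # ~3.6: dust-like, rising spectrum
print(spectral_index(s_148=60.0, s_218=45.0))   # ~-0.7: synchrotron-like, falling
```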

    Imprints, [Vol. 5]

    Get PDF
    This 1989 edition includes winners of the T. E. Ferguson Writing Contest, two honorable mentions, and a number of other entries that we felt deserved to be published. I would like to give special thanks to all the judges of the Ferguson Writing Contest who helped make this publication possible, and especially to Dr. Patricia Russell, who once again proved to be an invaluable asset. Her dedication and love for the organization and all it stands for have made this one of the most successful years ever.